Mean Field Theory for Random Recurrent Spiking Neural Networks
Authors
Abstract
Recurrent spiking neural networks can provide a biologically inspired model of a robot controller. Here we study the dynamics of large, randomly connected networks by means of mean field theory. Mean field theory makes it possible to compute their dynamics under the assumption that the dynamics of individual neurons are stochastically independent. We restrict ourselves to the simple case of homogeneous, centered, independent Gaussian synaptic weights. First, a theoretical study derives the mean-field dynamics using a large deviation approach. This dynamics is characterized as a function of an order parameter, the normalized variance of the coupling. Then various applications are reviewed, showing the practical potential of the approach.
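To make the role of the order parameter concrete, the following is a minimal numerical sketch in Python. It uses a classical rate-based random network (dx/dt = -x + J tanh(x)) rather than the spiking model studied in the paper, with centered Gaussian couplings whose normalized variance g^2/N plays the role of the order parameter; the network size, gain g and simulation horizon are illustrative choices, not values taken from the paper.

import numpy as np

def simulate_random_network(N=500, g=1.5, T=200.0, dt=0.1, seed=0):
    # Rate network dx/dt = -x + J @ tanh(x); couplings J_ij are i.i.d.
    # centered Gaussian with variance g**2 / N (normalized coupling variance).
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    x = rng.normal(0.0, 1.0, size=N)
    traj = []
    for _ in range(int(T / dt)):
        x = x + dt * (-x + J @ np.tanh(x))  # explicit Euler step
        traj.append(x.copy())
    return np.asarray(traj)

# Population-averaged activity variance after a transient; in mean-field
# theory this fluctuation level is controlled by the coupling variance g.
traj = simulate_random_network(g=1.5)
steady = traj[traj.shape[0] // 2:]
print("mean activity variance:", steady.var(axis=0).mean())

Raising or lowering g in this sketch moves the network between a quiescent and a strongly fluctuating regime, which is the qualitative picture that the mean-field characterization formalizes.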
Similar Resources
Locking of correlated neural activity to ongoing oscillations
Population-wide oscillations are ubiquitously observed in mesoscopic signals of cortical activity. In these network states a global oscillatory cycle modulates the propensity of neurons to fire. Synchronous activation of neurons has been hypothesized to be a separate channel of information processing in the brain. A salient question is therefore if and how oscillations interact with spik...
Neurophysiology of a VLSI Spiking Neural Network: LANN21
A recurrent network of 21 linear integrate-and-fire (LIF) neurons (14 excitatory; 7 inhibitory) connected by 60 spike-driven, excitatory, plastic synapses and 35 inhibitory synapses is implemented in analog VLSI. The connectivity pattern is random, at a level of 30%. The synaptic efficacies have two stable values, serving as long-term memory. Each neuron also receives an external afferent current. We...
Context-dependent representation in recurrent neural networks
In order to assess the short-term memory performance of non-linear random neural networks, we introduce a measure to quantify the dependence of a neural representation upon the past context. We study this measure both numerically and theoretically using the mean-field theory for random neural networks, showing the existence of an optimal level of synaptic weight heterogeneity. We further inves...
The Random Neural Network: A Survey
The Random Neural Network (RNN) is a recurrent neural network model inspired by the spiking behaviour of biological neuronal networks. In contrast to most Artificial Neural Network (ANN) models, neurons in the RNN interact by probabilistically exchanging excitatory and inhibitory spiking signals. The model is described by analytical equations, has a low-complexity supervised learning algorithm and is...
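For readers unfamiliar with the RNN model, the sketch below illustrates its well-known steady-state equations as a simple fixed-point iteration: q_i = lambda_plus_i / (r_i + lambda_minus_i), with the arrival rates given by the traffic equations. The routing matrices P_plus and P_minus, the external rates Lambda and lam, and the firing rates r used here are hypothetical inputs chosen only for illustration.

import numpy as np

def rnn_steady_state(Lambda, lam, r, P_plus, P_minus, n_iter=500):
    # Fixed-point iteration of the RNN traffic equations:
    #   lambda_plus_i  = Lambda_i + sum_j q_j r_j P_plus[j, i]
    #   lambda_minus_i = lam_i    + sum_j q_j r_j P_minus[j, i]
    #   q_i = min(lambda_plus_i / (r_i + lambda_minus_i), 1)
    q = np.full(len(r), 0.5)
    for _ in range(n_iter):
        lam_plus = Lambda + (q * r) @ P_plus
        lam_minus = lam + (q * r) @ P_minus
        q = np.minimum(lam_plus / (r + lam_minus), 1.0)
    return q

# Illustrative 3-neuron example (all rates and routing probabilities are made up).
r = np.array([1.0, 1.0, 1.0])
Lambda = np.array([0.3, 0.0, 0.0])
lam = np.array([0.0, 0.1, 0.0])
P_plus = np.array([[0.0, 0.5, 0.3], [0.0, 0.0, 0.6], [0.0, 0.0, 0.0]])
P_minus = np.array([[0.0, 0.1, 0.0], [0.0, 0.0, 0.2], [0.0, 0.0, 0.0]])
print("steady-state excitation probabilities:",
      rnn_steady_state(Lambda, lam, r, P_plus, P_minus))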
Intrinsically-generated fluctuating activity in excitatory-inhibitory networks
Recurrent networks of non-linear units display a variety of dynamical regimes depending on the structure of their synaptic connectivity. A particularly remarkable phenomenon is the appearance of strongly fluctuating, chaotic activity in networks of deterministic, but randomly connected rate units. How this type of intrinsically generated fluctuation appears in more realistic networks of spikin...